
Experimental Validation: Subscale Aircraft Ground 
Facilities and Integrated Test Capability 

Roger M. Bailey*, Robert W. Hostetler†, Kevin N. Barnes‡,

Celeste M. Belcastro§, and Christine M. Belcastro**

NASA Langley Research Center, Hampton VA 23681 


Experimental testing is an important aspect of validating complex integrated safety- 
critical aircraft technologies. The Airborne Subscale Transport Aircraft Research 
(AirSTAR) Testbed is being developed at NASA Langley to validate technologies under 
conditions that cannot be flight validated with full-scale vehicles. The AirSTAR capability 
comprises a series of flying sub-scale models, associated ground-support equipment, and a 
base research station at NASA Langley. The subscale model capability utilizes a generic 
5.5% scaled transport class vehicle known as the Generic Transport Model (GTM). The 
AirSTAR Ground Facilities encompass the hardware and software infrastructure necessary 
to provide comprehensive support services for the GTM testbed. The ground facilities 
support remote piloting of the GTM aircraft, and include all subsystems required for 
data/video telemetry, experimental flight control algorithm implementation and evaluation, 
GTM simulation, data recording/archiving, and audio communications. The ground 
facilities include a self-contained, motorized vehicle serving as a mobile research 
command/operations center, capable of deployment to remote sites when conducting GTM 
flight experiments. The ground facilities also include a laboratory based at NASA LaRC that provides
near-identical capabilities to the mobile command/operations center, as well as the capability to receive
data/video/audio from, and send data/audio to, the mobile command/operations center during GTM flight
experiments.


1.0 Introduction 

The Single Aircraft Accident Prevention Project of the NASA Aviation Safety and Security Program (AvSSP) 
is developing technologies to reduce aircraft accidents resulting from loss of vehicle control as well as failures [1]. 
Loss of control ranks among the highest accident categories across all vehicle classes, in both the number of
accidents and the number of fatalities. Although many loss-of-control accidents include system or component
failures as a primary or secondary causal factor, failures have also caused a significant number of fatal accidents
that can be listed as a separate category. The research goal is to make revolutionary enhancements to
aviation safety by improving the ability to monitor vehicle health and to safely manage fault and failure conditions 
in aircraft systems, and to prevent and recover from aircraft loss-of-control conditions. The scope of research 
encompasses all aspects of the hardware and software life cycle, from analysis of design specifications, 
implementation and verification, to validation methods to ensure design correctness, fault tolerance, and functional 
integrity in the operating environment. A validation process is currently under development that includes a 
coordinated approach to analysis, simulation, and experimental test methods [2] - [3]. Experimental testing includes 
the Airborne Sub-scale Transport Aircraft Research (AirSTAR) testbed capability that is being developed to validate 
technologies that cannot be flight validated with full-scale vehicles. Dynamics modeling and simulation 
technologies for adverse and upset flight conditions require flight validation at extreme and upset conditions, as well 
as unusual flight maneuvers within the normal flight envelope. Detection and identification technologies for failures 
and damage under adverse and upset flight conditions will require flight validation under normal, extreme, and upset 
flight conditions. Upset recovery algorithms, reconfigurable control for failures and damage, and adaptive
outer-loop autopilot and trajectory management algorithms for damage and failure conditions will also require
flight validation under extreme and upset flight conditions. These validation requirements are unsuitable for
full-scale flight validation due to high risk, but can be satisfied using sub-scale models.


* Senior Research Engineer, Safety-Critical Avionics Systems, 1 S. Wright St., Mail Stop 130.

† Research Engineer, Safety-Critical Avionics Systems, 1 S. Wright St., Mail Stop 130.

‡ Research Engineer, Safety-Critical Avionics Systems, 1 S. Wright St., Mail Stop 130.

§ Senior Research Engineer and Technical Manager, Safety-Critical Avionics Systems, 1 S. Wright St., Mail Stop 130, AIAA Senior Member.

** Senior Research Engineer and Technical Manager, Dynamic Systems and Control, 8 Langley Blvd., Mail Stop 308, AIAA Senior Member.

Use of a Remotely Piloted Vehicle for conducting flight tests to validate control upset prevention and recovery 
technologies (including modeling and control methods for characterizing and recovering from upset conditions) will 
require a robust and reliable ground support system for pilots and researchers. This paper describes the ground 
facilities being developed to support the NASA Langley Research Center AirSTAR testbed. The AirSTAR aircraft 
is a 5.5% dynamically scaled, remotely piloted, Generic Transport Model (GTM) which will be used to conduct 
flight test research experiments pertaining to dynamics modeling and control beyond the normal flight envelope [4] 
- [5]. Figure 1 shows a functional diagram of the basic GTM-Ground Facilities (GTM-GF) architecture. 



[Figure 1 block diagram elements: telemetry link to the aircraft, ground computers, GTM simulation, data storage,
remote communications, pilot and researcher flight station, monitor and test director stations, safety pilot station
with R/C transmitter/receiver (Mobile Operations Station only), video tracking system, and flight line camera, with
their data, video, and audio interconnections.]


Figure 1. Block Diagram of Ground Facilities Functionality 

The fundamental purpose of the GTM-GF is to provide the apparatus to accomplish the following: 


• Permit research pilots to control the GTM; 

• Receive, process, record and transmit GTM telemetry data; 

• Display information to pilots, researchers, systems engineers and observers; 

• Provide voice communications for experiment participants and observers; 

• Provide data processing systems capable of computing control parameters on the ground to augment pilot 
control and/or remotely drive aircraft control surfaces; 

• Provide real-time GTM simulation for training, testing and experiment reproduction. 

The GTM Ground Facilities are designed to align with the overall development of the NASA Langley Research 
Center Systems and Airframe Failure Emulation, Testing and Integration (SAFETI) Research Laboratory [6]. The 
SAFETI Lab is being developed for validation testing of AvSSP technologies and to support long-term Aviation 
Safety and Security Program experimental research goals and objectives. 

The GTM-GF will ultimately be the combination of two separate resources [7] - [8]: (1) the Mobile Operations
Station (MOS), a self-contained, motorized vehicle serving as a mobile research command/operations center,
capable of deployment to remote sites when conducting GTM flight experiments [9]; and (2) the Base Research
Station (BRS), a laboratory based at the NASA Langley Research Center providing near-identical capabilities to
the MOS. Figures 2 and 3 below outline the basic differences between the hardware systems in the BRS and the MOS.












Figure 2. MOS Hardware Systems Overview 



Figure 3. BRS Hardware Systems Overview 


The overall GTM-GF architecture will also include the capability to link the BRS and MOS (data, video and voice) 
during GTM flight experiments. The BRS will be used for experimental software development and testing. The 
BRS also includes a “hot bench” work area which will accommodate hardware-in-the-loop test and evaluation with 
the GTM. 

This paper provides a detailed description of the AirSTAR testbed ground facilities being developed at the 
NASA Langley Research Center. Section 2 describes the major hardware systems. Data archiving and playback is 
discussed in Section 3. Section 4 describes the Audio/Video Systems. The satellite-based data/video link is 
described in Section 5. A description of the GTM Video Tracking System is provided in Section 6. The Mobile 
Operations Station is described in Section 7. Section 8 discusses the Safety Pilot Station. Major software 
components are described in Section 9. A summary and some concluding remarks are provided in Section 10. 





2.0 Major Hardware Subsystems 


Major hardware subsystems include the air-ground data link, computational resources, and user interfaces.

Each of these subsystems is discussed in the following subsections. 

2.1 Air-Ground Datalink 

Communications between the AirSTAR aircraft and the GTM-GF are via a duplex serial data link using
microwave RF transceivers, with a separate safety-pilot backup command RF link (ref. Section 8.0). The current
experiment for which the GTM-GF is being configured will conduct flight maneuvers within a test range airspace
volume of approximately 2 cubic statute miles (2 miles x 1 mile x 1 mile). To maintain effective telemetry
communications coverage, the GTM-GF will employ a directionally-tracked, circularly-polarized antenna system.
Analysis of the anticipated RF budget (involving transmitter powers, receiver noise floors, and antenna gains)
supports the desired operation of the microwave links to slant ranges up to 2 statute miles with power margins in
excess of 15 decibels. The AirSTAR telemetry system evolved from an air-to-ground datalink system developed by
Microbotics Inc., and utilizes a proprietary encoding scheme based on the IRIG-106 standard. The telemetry system
comprises a custom-built airborne unit installed in the GTM and a custom-built ground unit interfaced to the
GTM-GF [10] - [12].
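
As an illustration of the kind of link-budget analysis referred to above, the sketch below computes the free-space
path loss at the S-band downlink frequency over the 2-statute-mile slant range. The transmitter power, antenna
gains, and receiver sensitivity used here are placeholder assumptions for illustration only, not the actual AirSTAR
link parameters.

import math

# Free-space path-loss check at the 2240.5 MHz downlink frequency over a
# 2-statute-mile slant range. All hardware figures below are assumed
# placeholder values, not the actual AirSTAR link parameters.
FREQ_GHZ = 2.2405
SLANT_RANGE_KM = 2 * 1.609344                 # 2 statute miles

# Free-space path loss in dB, with distance in km and frequency in GHz.
fspl_db = 20 * math.log10(SLANT_RANGE_KM) + 20 * math.log10(FREQ_GHZ) + 92.45

tx_power_dbm = 30.0          # assumed 1 W transmitter
tx_gain_dbi = 3.0            # assumed airborne antenna gain
rx_gain_dbi = 12.0           # assumed tracked, circularly-polarized ground antenna
rx_sensitivity_dbm = -95.0   # assumed receiver threshold

rx_power_dbm = tx_power_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db
margin_db = rx_power_dbm - rx_sensitivity_dbm
print(f"path loss {fspl_db:.1f} dB, received {rx_power_dbm:.1f} dBm, margin {margin_db:.1f} dB")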

Telemetry Airborne Unit (TAU)

The TAU's primary function is to multiplex and de-multiplex the telemetry data for the GTM's on-board
instrumentation and control systems. The on-board TAU is microprocessor-based and incorporates hardware to
assist in aircraft operations and instrumentation, including signal interface boards, a safety switch, and power
supplies. The TAU acquires data from various sensors aboard the GTM, including transducers indicating flight
surface positions, angle-of-attack probes, engine control units (ECU), the MIDG II††, and battery status, and
employs a 'Flight Controller' module to multiplex these data into the air-to-ground telemetry stream. Conversely,
the 'Flight Controller' module de-multiplexes the ground-to-air telemetry stream, distributing commands to GTM
subsystems, including the flight surface servos. Additionally, video from the GTM's nose camera will be
transmitted to the ground via the RF link.

Telemetry Ground Unit (TGU) 

The TGU's primary function is to multiplex and de-multiplex the telemetry data for GTM-GF client computers; it is
housed in a standard 19-inch rack-mountable enclosure. The TGU interfaces to the GTM-GF computing system via
four asynchronous serial (RS-422) ports that are used to receive flight data from the GTM and transmit control
surface servo commands to the GTM. These ports, each running at 115200 baud, carry the telemetry data stream
partitioned as shown in Tables 1 and 2.


Table 1: Ports for Telemetry from GTM to GTM-GF (Downlink)
(Downlinked on S-band frequency 2240.5 MHz)

Ports 0-2:
• 16 analog signals per port (48 total), including surface positions, angles of attack, airspeed, battery status, etc.
• 8 digital inputs each on port 0 and port 1, and 16 digital inputs on port 2 (32 total), including system status
codes, gear up/down, etc.
• 1 serial stream each on port 0 and port 1 for the engine control units (2 total), carrying codes from the left and
right ECUs

Port 3:
• 1 serial stream from the MIDG II inertial package


Additionally, the TGU receiver uses an RF sideband to receive video from the GTM nose camera, with this received 
video signal present at a rear-panel BNC connector. 
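
Because the downlink arrives on four independent RS-422 ports, a ground-side reader simply services each port at
115200 baud. The sketch below is a minimal illustration that assumes the ports appear to the host as ordinary
serial devices; the device names and fixed frame lengths are placeholders, and the proprietary IRIG-106-based
encoding is not reproduced here.

import serial  # pyserial

# Nominal downlink message sizes per port (placeholder device names).
PORTS = {
    "/dev/ttyS0": 40,    # ports 0-2: ~40-byte messages each
    "/dev/ttyS1": 40,
    "/dev/ttyS2": 40,
    "/dev/ttyS3": 160,   # port 3: MIDG II INS message (~160 bytes)
}

links = {name: serial.Serial(name, baudrate=115200, timeout=0.02)
         for name in PORTS}

def read_frames():
    """Pull one nominal-length frame of raw bytes from each port; decoding of
    the proprietary stream is omitted."""
    return {name: link.read(PORTS[name]) for name, link in links.items()}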


†† The MIDG II is a product from Microbotics Inc. which outputs GTM rotational rates, attitude, accelerations, GPS velocity and
position, inertial navigation velocity and position (filtered velocity and position from GPS measurements and inertial
measurements), magnetometer data, and GPS satellite data, all in a very small, lightweight package.





Table 2: Ports for Telemetry from GTM-GF to GTM (Uplink)
(Uplinked on L-band frequency 1835.5 MHz)

Ports 0-1: 16 Pulse Width Modulated (PWM) servo command signals each (32 total)

Port 2: 16 digital output level commands for the Microbotics Servo Switch/Controller (SSC)‡‡; this message is
ignored at the airborne unit if the SSC auxiliary channel is defined as a digital input

Port 3: RTCM DGPS corrections to the GTM


The rate for GTM-to-ground telemetry (excluding MIDG II) data is configurable within a range of 200 to 250 Hz,
and for AirSTAR operations is set to 216 Hz to accomplish the required data rates. The MIDG II provides INS data
at 50 Hz. Consequently, a 'subframe' period of approximately 4.6 ms is established for parameters downlinked on
ports 0-2, while a 'major frame' period of 20 ms is established for MIDG II data downlinked on port 3.

Calculations indicate that downlink data from ports 0-2 (40 bytes per port) will require 2.77 ms (60% of a
subframe) to arrive at GTM-GF computers. MIDG II INS data downlinked on port 3 (160 bytes) are estimated to
require approximately 11.11 ms (2.37 subframes = 55% of a major frame) to arrive at GTM-GF computers.

Calculations indicate that uplink data to ports 0-2 (< 38 bytes per port) will require 2.64 ms (57% of a subframe)
to be received into the TGU. Thus, GTM-GF computing systems must process all parameters within an estimated
20 ms - (11.11 ms + 2.64 ms) = 6.25 ms.
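
The sketch below reproduces the arithmetic behind these numbers. It assumes 8 data bits per byte on the wire
(start/stop framing overhead neglected), which is how the quoted transfer times work out.

# Frame-timing budget for the telemetry ports, following the figures quoted above.
BAUD = 115200              # bits per second on each RS-422 port
SUBFRAME_S = 1.0 / 216     # ~4.63 ms subframe for ports 0-2
MAJOR_FRAME_S = 1.0 / 50   # 20 ms major frame for MIDG II data on port 3

def transfer_time_s(num_bytes, bits_per_byte=8):
    """Time to move num_bytes over one serial port (framing overhead neglected)."""
    return num_bytes * bits_per_byte / BAUD

downlink_ports_0_2 = transfer_time_s(40)    # ~2.78 ms
downlink_port_3 = transfer_time_s(160)      # ~11.11 ms, ~55% of a major frame
uplink_ports_0_2 = transfer_time_s(38)      # ~2.64 ms

print(f"ports 0-2 use {downlink_ports_0_2 / SUBFRAME_S:.0%} of a subframe")

# Remaining budget for ground-side processing within one 20 ms major frame:
processing_budget = MAJOR_FRAME_S - (downlink_port_3 + uplink_ports_0_2)
print(f"processing budget: {processing_budget * 1e3:.2f} ms")  # ~6.25 ms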

2.2 Computational Resources 

Scaled flight research has demanding data system frame timing deadlines that require the deterministic program 
execution rate provided by a real-time operating system (RTOS). The GTM-GF computing system is capable of 
accomplishing three primary tasks: 

1. receiving telemetered GTM state data (e.g. sensor outputs, discretes, etc.), converting these data to
calibrated engineering units in real-time (a minimal conversion sketch follows this list), and transmitting
control-surface and engine commands and other required state data to the GTM via the telemetry system;

2. processing, in real-time, researcher-provided algorithms, including 

• flight control algorithms to compute control-surface and engine commands in response to pilot
inputs and/or aircraft state data; 

• failure detection and isolation (FDI) algorithms, upset recovery algorithms, and guidance 
algorithms; 

3. providing real-time, hardware-in-the-loop (HIL) simulation of the GTM, including the capacity for 
integrating synthetic vision display components and terrain databases for flight control by the research 
pilot. 
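
As a minimal illustration of the engineering-unit conversion in task 1 of the list above, the sketch below applies
a linear calibration to raw telemetry counts. The channel names, scale factors, and offsets are hypothetical
placeholders; the actual GTM calibrations are not given in this paper.

# Hypothetical linear calibrations: engineering value = scale * counts + offset.
CALIBRATIONS = {
    # channel: (scale, offset, engineering unit)
    "elevator": (0.0125, -25.0, "deg"),
    "airspeed": (0.0300, 0.0, "m/s"),
    "battery": (0.0050, 0.0, "V"),
}

def to_engineering_units(raw_counts):
    """Convert one frame of raw counts to calibrated engineering units."""
    return {name: scale * raw_counts[name] + offset
            for name, (scale, offset, _unit) in CALIBRATIONS.items()
            if name in raw_counts}

# One example frame of raw downlink counts (hypothetical values):
print(to_engineering_units({"elevator": 2100, "airspeed": 1300, "battery": 2520}))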

Non-time-critical computing tasks, such as generation of user displays, are managed by Intel®/Windows®-based
personal computers.

Time-critical computing tasks will be managed by the dSPACE® computing platform. The dSPACE Real-Time
Operating System (RTOS) was chosen to control the GTM while flying research control laws. dSPACE, which
integrates target processor hardware, C code, analog and digital input/output, and control law block diagrams, was
chosen because it is based on the Matlab® and Simulink® products developed by The Mathworks, Inc. The dSPACE
platform is a good fit with existing commercial aircraft simulations based on tools developed by The Mathworks,
Inc. Matlab's plotting and scripting features facilitate manipulation of wind tunnel data sets to obtain
aerodynamic data parameters and to verify control algorithms against check data. The graphical data flow
characteristics of Simulink are a close fit to the characteristics of control law diagrams in aerospace
documentation. The use of dSPACE reduces the amount of custom programming necessary to integrate the aircraft
telemetry subsystems. The commonality between dSPACE and the simulation tools facilitates exercising the software
at each stage of development in a realistic execution environment, and rapidly testing various HIL configurations.


‡‡ The Microbotics Servo Switch/Controller (SSC) is a highly configurable, multiplexing switch for servo command
signals. It allows dynamic selection among several sources of pulse signals for the servos. Pulse sources include
asynchronous serial communication packets from a control computer, pulse signals from a conventional R/C receiver
or other servo pulse generator, and user-definable constant signals.

2.3 User Interfaces 

GTM-GF operations will be conducted by personnel located at three primary work areas: the Flight Research
Station, the Operations Command Station, and the Operations Engineering Station. Each of these work areas is
described below.

2.3.1 Flight Research Station (FRS)

The GTM pilot and the research project's principal investigator (PI) will be located in the FRS. This station
provides the means to fly the GTM manually or automatically through engagement of a researcher-provided autopilot
flight control system implemented in the dSPACE computer. The FRS will provide all necessary pilot controls and
a variety of displays for the pilot and PI. Figures 4 and 5 show photographs of the FRS and BRS hardware,
respectively.



Figure 4. Flight Research Station in BRS 



Figure 5. Photograph of BRS subsystem hardware - note view into “hotbench” area 





The display layout for the FRS is shown in Figure 6. 


[Figure 6 elements: Research Parameter Time History display, Primary Research Pilot Display, Backup Primary
Flight Display, Navigation Display, Tablet PC display, and flight card.]

Figure 6. Display Layout for the Flight Research Station 


Displays generated for the FRS will include the Primary Research Pilot Display, the Navigation Display, as well as 
other displays. These displays are described below. 

GTM Primary Research Pilot Display (PD) 

The PD will serve as the pilot’s primary indicator of flight information, incorporating display elements typically 
presented as part of the Primary Flight Display (PFD) found on modern commercial and business aircraft. The PD
provides GTM flight information, including but not limited to, attitude and heading, airspeed, altitude, track angle, 
flight path angle, load factor, angle-of-attack, and sideslip angle. The PD can also be configured to integrate 
synthetic terrain or nose camera video into its background, and provide programmable primary display symbology 
for the research pilot. Figure 7 shows an example of a potential PD display format with a synthetic vision 
background. 



Figure 7. Prototype GTM Primary Research Pilot Display 







GTM Navigation Display (ND)

The ND will serve as the pilot's primary indicator of situational flight information, incorporating display
elements typically presented as part of the Navigation Display found on modern commercial and business aircraft.
It will be the pilot's primary indication of the GTM's location during flight tests, and can be configured to
provide a variety of indicators and alerts, including a test range map with boundaries, runway depiction, vehicle
track angle, desired horizontal path waypoints, and predicted path over a specified time period. Figure 8 shows an
example of a potential ND format.





[Figure 8 annotation: flight path predictor ("noodle").]


Figure 8. Prototype GTM Navigation Display for the Research Pilot Station 


Other Displays 

In addition to the Primary Research Pilot Display and Navigation Display, the GTM-GF provides for the generation 
and distribution of appropriate graphical displays for all flight test personnel including: 

• any available GTM state data and/or experiment parameters 

• camera video (e.g. tracking system, GTM nose camera, other on-board and ground-based cameras) 

• any available health and/or status information associated with GTM and GTM-GF subsystems (e.g. on- 
board battery power, TM link signal strength, weather data, etc.) 


2.3.2 Operations Command Station (OCS)

The Flight Test Director, who is responsible for coordinating all flight-test-related activities, and another
researcher will be located in this area. The OCS will be provided with six display units and the capability to
configure and select among all available video sources as desired. The OCS will provide for monitoring and control 
of all audio communications between flight test participants. The OCS will also provide the capability to conduct 
researcher-specific activities such as: 

• selecting test parameters for display; 

• setting test parameters that control test conditions; 

• setting simulated fault conditions; 

• performing pre/post-flight analysis of recorded flight data, including running the GTM simulation. 

Figures 9 and 10 show the general layout for the OCS and a photograph, respectively. 




[Figure 9 elements: LCD flat panel displays, video select control, keyboard and mouse.]


Figure 9. General Layout for Operations Command Station and Operations Engineering Stations 



Figure 10. Photograph of Operations Command Station and Operations Engineering Stations in BRS 


2.3.3 Operations Engineering Station (OES) 

Hardware and software support engineers will be located in this area. These support engineers will be 
responsible for monitoring and maintaining GTM-GF hardware and software subsystems during flight test 
operations. The OES will be similar in appearance to the OCS, and will have keyboard and mouse access to all 
GTM-GF computers, as well as the satellite terminal and data network (see Figures 9 and 10). 


3.0 Data Archiving and Playback 

Integral to the mission of AirSTAR scientific research is data collection and storage. The success of post-flight
analysis hinges on the accurate and properly registered capture of GTM state parameters and GTM-GF process
elements; with valid and complete data, the stated goals of investigating new aerodynamic technologies can be more
fully realized. The GTM-GF is equipped with an Asynchronous Real-Time Multiplexer and Output Reconstructor
(ARMOR) subsystem. This system element consists of an Apogee Labs Model 4800 Digital Recorder Unit (DRU)
interfaced to an Apogee Labs MITC FALCON Model 4303 Multiplexer/Demultiplexer. The multiplexing system
merges multiple data sources, including GTM PCM telemetry data, 10 Mb/s Ethernet, voice, time code signals, and
video into a composite stream for digital recording. A plug-in Recorder Interface Module (RIM) provides a link to
the DRU, which provides a recording interface to RAID 1 hard drives with a 90 Mb/s data transfer rate.








The ARMOR system is configured to provide playback of all recorded GTM flight data parameters, physical
camera video outputs, synthetic display video outputs, audio loops, and UTC time tag information. Reconstruction
may be partial or complete, covering the downlink/uplink telemetry to the GTM, camera video, audio loops, and
master UTC information, and allows new real-time simulations to execute using recorded state data. Partial
reconstruction provides the capability to perform engineering tests of the air-ground system and to debug
anomalous behavior.


4.0 Audio/Video Systems 

The GTM-GF Audio System is described in Section 4.1 and the Video System is described in Section 4.2. 

4.1 GTM-GF Audio System 

The GTM-GF provides the communications infrastructure to allow all key participants in GTM flight test 
activities to maintain voice contact. The heart of the intercom system is the Clear-Com Inc. MicroMatrix® 24- 
channel digital intercom system. Onboard slots for interface cards permit seamless interfacing with telephones, two- 
way radios, party-line intercoms, and wireless belt packs. Set up of point-to-point communications, groups, 
monitoring lines, and “virtual” party lines for all flight test participants is accomplished using interactive and menu- 
driven Windows-based software. The intercom master control system is operated by the flight test director during 
GTM flight test operations. 

4.2 GTM-GF Video System 

The GTM-GF provides the video hardware infrastructure to allow all video camera output signals and all computer 
generated displays to be amplified and split as necessary for distribution to key participants in GTM flight test 
activities. The heart of the GTM-GF video distribution system is Extron Electronics' CrossPoint Plus 24x24 matrix
video switch, an ultra-wideband analog RGBHV matrix switcher capable of distributing any of 24 video inputs to
any combination of 24 video outputs simultaneously. Additionally, a Crestron Inc. MC2E controller is interfaced to 
the video matrix switch, permitting users to operate the matrix switch from the Flight Research Station, Operations 
Command Station or the Operations Engineering Station. 


5.0 Satellite-based Data/Video Link 

In order to provide the capability to monitor MOS GTM flight test activities at the Base Research Station 
located at Langley Research Center, a satellite backbone transmission system will be utilized to (1) transmit GTM 
telemetry state data, optical video and voice communication from MOS to BRS, and (2) transmit desired support 
data and voice communications from BRS to MOS. The major communications system elements comprising the 
satellite system consist of a fixed base station satellite dish, a deployable remote station satellite dish, and the 
necessary interface equipment. The Base and Remote Stations employ a satellite backbone transmission system 
augmented by data, video and voice communication facilities, and an Internet Protocol (IP) routing system. The IP 
routing system provides for multiple communication transmission types and automatic routing path requirements. 
The base station consists of a 2.4-meter dish antenna and rack-mounted satellite, video, audio, and routing
equipment. The remote station consists of a 1.2-meter dish antenna contained in two field carrying cases, a field
case for the antenna RF equipment, and satellite, video, audio, and routing equipment mounted in a separate field
carrying case. The remote station is further augmented by a wireless video and audio transmission system, which
provides for line-of-sight transmission of tracking video and voice communications from the Video Tracking Station.

The satellite system will rely on an Immeon network communications link. The Immeon System is a joint
venture between Loral Skynet and ViaSat. The Immeon System is unique in that it can supply connectivity between
satellite stations via a satellite, in this case Telstar IV, on an as-needed basis, which lowers satellite usage
costs. The connection service is available 24 hours per day, 7 days a week, and is provided by a teleport and
network control system located in Atlanta, GA. The system communications are based on Internet Protocol (IP)
addressing, fully automating all routing of video, data, and voice communications within the system. This is
accomplished by using commercial off-the-shelf network routing equipment. The routing equipment will utilize
telephony-switching equipment to accommodate voice communications. The camera optical data will be processed by a
video encoder at the MOS site for satellite transmission, and translated back to video at the BRS site for display
purposes.





The satellite system will utilize two carrier frequencies within the U.S. Ku band, between 14.0 and 14.5 GHz. As
this frequency range is in a non-federal government band, it was coordinated through the Federal Communications
Commission (FCC) and is subject to FCC jurisdiction. Consequently, there is a one-year duration of use with
renewal options.

The 2.4-meter fixed dish at the BRS will employ an 8-Watt power amplifier to achieve a 1.0 Mbps (1 Mbps =
1,048,576 bits per second) uplink data rate. The 1.2-meter deployable dish at the MOS will employ a 25-Watt power
amplifier to achieve a 2.048 Mbps uplink data rate.

As the Telstar IV system comprises satellites located approximately 22,000 miles above the equator in
geosynchronous Earth orbit, closed-loop operations involving GTM control inputs generated at the BRS will incur a
latency of at least 0.5 seconds. For this reason, remote piloting of the GTM from the BRS is not currently planned.
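
A quick back-of-the-envelope check of that latency figure is sketched below: each satellite hop spans roughly
twice the geosynchronous altitude, and a closed loop requires one hop down for telemetry and one hop back up for
commands (slant-range geometry and processing delays, which only add to this, are neglected).

C_M_S = 299_792_458.0        # speed of light, m/s
GEO_ALTITUDE_M = 35_786e3    # ~22,000 miles above the equator

one_hop_s = 2 * GEO_ALTITUDE_M / C_M_S   # ground -> satellite -> ground
closed_loop_s = 2 * one_hop_s            # telemetry down to BRS plus command back up
print(f"one hop {one_hop_s:.3f} s, closed loop {closed_loop_s:.3f} s")  # ~0.24 s, ~0.48 s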


6.0 Video Tracking System 

In order to provide the capability to monitor the GTM in flight, a ground-based video tracking system will be
deployed with the MOS and will track the airborne vehicle using cameras and electronic equipment. The tracking
video could be used by safety pilots to help fly the GTM back to the MOS if a takeover is required when the GTM is
beyond the line of sight. This video tracking system will be augmented by a wireless video and audio transmission
system that provides for transmission of tracking video and voice communications from the video tracking system to
the MOS. The tracker's limited maximum slew rate (an estimated 6 degrees per second) dictates that the tracker be
set up at a location offset one to two statute miles from the MOS, and within line of sight of the MOS wireless
video receiver. This location will be called the Tracking System Station (TSS), and an operator will be required
to operate the system. The TSS will employ a two-way communication link to the intercom master control system
located at the Operations Command Station. The tracking system is capable of receiving GTM GPS/DGPS position data
in order to augment its ability to maintain tracking lock, and tracking video will be transmitted to the MOS for
display as desired by users at the various operations stations.
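
The geometry behind the stand-off requirement can be illustrated as follows. For a straight, level flyby the peak
line-of-sight angular rate is roughly the airspeed divided by the closest ground distance; the airspeed used below
is an assumed value for illustration only, not a GTM figure.

import math

MAX_SLEW_DEG_S = 6.0           # tracker limit quoted above (estimated)
assumed_airspeed_m_s = 60.0    # hypothetical subscale flyby speed

# Minimum stand-off distance for the peak angular rate to stay within the limit.
min_standoff_m = assumed_airspeed_m_s / math.radians(MAX_SLEW_DEG_S)
print(f"minimum stand-off distance: {min_standoff_m:.0f} m")   # ~570 m

# A stand-off of a statute mile or more keeps the worst-case rate well below 6 deg/s.
print(f"rate at 1 statute mile: {math.degrees(assumed_airspeed_m_s / 1609.344):.1f} deg/s")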

The GTM Video Tracking System is a portable and fully self-contained system, designed and constructed by
Celestial Computing Corp. The tracker can be disassembled and transported in six travel cases, each case having a
maximum allowable weight of 120 pounds. The tracking system comprises a portable pier tripod assembly supporting
optical acquisition lenses and video cameras mounted on a motorized 2-axis gimballed system. The system includes
an Automated Telescope Startup Unit (ATSU) with precision home and park positions to 0.001 degrees, with
electronic Periodic Error Correction (PEC) and a Charge-Coupled Device (CCD) port for astronomical self-guiding,
and is capable of computerized full-sky "Go-To" pointing, which can be set up for remote or local operation. The
tracker is capable of equatorial and alt-azimuth alignment and tracking, with adjustable slew speeds up to 6.0
degrees per second. This hardware is interfaced to a PC running customized software, which sends commands to the
gimbal system to maintain tracker lock on the GTM.


7.0 Mobile Operations Station (MOS) 

The Mobile Operations Station is a deployable remote laboratory that will be employed as the central
research and command center for GTM research flight operations. In addition to all of the hardware and software
components utilized in the BRS, the MOS contains the ancillary support subsystems required for operation in the
field, including a power generator, an Uninterruptible Power Supply (UPS) back-up subsystem, a restroom facility,
a kitchenette, a small meeting/work area, a walk-on roof with safety rails, leveling jacks, and a self-stowing,
deployable side awning. In addition to all interfaces available in the BRS, the MOS provides external power and
data interfaces for additional subsystems, including the Safety Pilot Station, the TSS video modem and Flight Line
video camera, a wireless transmitter for GPS data, VHF and cellular communications devices, LAN/internet, a
1.2-meter portable satellite dish, and 120 VAC "shore" power feeds.

The GTM MOS is built on a Freightliner® chassis and has the following specifications: 

• GVWR - up to 54,000 lbs.;

• Length - 40 feet; 

• Rear Axle - Tandem, up to 40,000 lbs.; 

• Wheelbase - 290”; 

• Engine: CAT C7, 330 Hp. 





The vehicle's mission goal of providing a self-contained mobile research laboratory environment defines the design
and minimum performance standard requirements of the Aft Body area. The Aft Body area is a self-contained,
self-supporting laboratory consisting of four (4) sectioned-off areas that provide support to the three (3) user
operations stations, four (4) full-size equipment racks, two (2) half-size equipment racks, and a galley with a
kitchen and lavatory area. Figure 11 shows the MOS floor plan.



Figure 11. Mobile Operations Station Floor Plan View 


8.0 Safety Pilot Station 

In recognition of the fact that research flights will be increasingly dynamic in nature and will deliberately set
up high-risk, off-nominal (e.g. large angle-of-attack and sideslip) flight conditions, it is deemed necessary to
retain the role and inputs of a conventional R/C pilot to enhance overall mission safety. This capability to host
a secondary controller, referred to as the safety pilot, has the following attributes:

• The safety pilot is located externally to the MOS vehicle to provide a line-of-sight view of the GTM so as to 
afford direct observation of craft attitude when possible. If takeover is required when the GTM is beyond the 
line-of-sight, the safety pilot will use the nose camera video, tracker camera video, or a combination of the 
videos to fly the GTM back into line-of-sight. 

• The safety pilot inputs originate from a conventional handheld R/C transmitter that is configured to 
continuously emanate a special safety channel signal. 

• The avionics of the GTM aircraft are engineered to receive the safety channel signal for the purpose
of discriminating among four flight control cases (a notional selection sketch follows this list):

> ground research pilot inputs for advanced/general experiment conditions; 

> ground safety pilot inputs for basic/aborted experiment conditions; 

> onboard return-to-home autopilot inputs; 

> onboard range-safety inputs for abnormal/terminated flight conditions. 

• The safety pilot has a dedicated two-way audio communication channel with the Flight Test Director Station 
and the Research Pilot Station. 
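
The sketch below gives a notional picture of the four-way control-source selection referenced in the list above.
The actual GTM avionics logic and safety-channel encoding are not described in this paper, so the states and
priority ordering here are illustrative assumptions only.

from enum import Enum, auto

class ControlSource(Enum):
    RESEARCH_PILOT = auto()   # ground research pilot, advanced/general experiment conditions
    SAFETY_PILOT = auto()     # ground safety pilot, basic/aborted experiment conditions
    RETURN_TO_HOME = auto()   # onboard return-to-home autopilot
    RANGE_SAFETY = auto()     # onboard range-safety action for abnormal/terminated flight

def select_source(safety_channel_ok, safety_pilot_engaged, range_safety_tripped):
    """Pick which command stream drives the servos (illustrative priority only)."""
    if range_safety_tripped:
        return ControlSource.RANGE_SAFETY
    if not safety_channel_ok:
        # Loss of the continuously transmitted safety-channel signal: assume the
        # onboard return-to-home autopilot takes over.
        return ControlSource.RETURN_TO_HOME
    if safety_pilot_engaged:
        return ControlSource.SAFETY_PILOT
    return ControlSource.RESEARCH_PILOT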

Figure 12 illustrates how the Safety Pilot Station interfaces to the MOS. 




[Figure 12: the Mobile Operations Station (MOS) and the Safety Pilot's Station exchange direct video feeds (out of
and into the MOS).]


Figure 12. Safety Pilot Station Interfaces 


9.0 Major Software Components 


The GTM-GF system network architecture is shown in Figure 13. 


[Figure 13: network diagram including the Ops Command Station and tactical display among the GTM-GF computers.]






Figure 13. GTM Ground Facilities System Network Architecture 


The major software components will be contained within eight PCs and a real-time simulation computer
(dSPACE), as shown in Figure 13. The overall GTM-GF software architecture has three broad components: the
simulation, the displays, and the networking component, for both the BRS and the MOS. These are discussed in
Sections 9.1 and 9.2.






9.1 Simulations 


The GTM simulation is capable of being executed in both the BRS and the MOS; it will acquire pilot and
researcher input signals, generate six-degree-of-freedom (6DOF) flight dynamics and sensor data, and emulate the
GTM down-link telemetry stream. The simulation will run on a Personal Computer (PC) platform with the Windows
operating system, using the Matlab/Simulink programming tools. The GTM simulation is referred to in this paper as
the Flight Dynamics Model (FDM), and will operate in three distinct configurations:

• Personal Computer (FDM-PC), standalone, near real-time operation, on one or more PCs; 

• Piloted Simulation (FDM-PSIM), real-time simulated operation on the dSPACE platform; 

• Hardware In the Loop (FDM-HIL) real-time operation simulation on the dSPACE platform. 

Each of these configurations is discussed below. 

FDM-PC 

The FDM-PC is a standalone Matlab/Simulink simulation operating in near real-time on a Windows-based PC
platform. The program is configured to receive input from a selection of flight controller hardware, send flight 
parameter information to a selection of graphical visual display programs, and send emulated telemetry sensor and 
flight information to an Ethernet port in User Datagram Protocol (UDP) packets to support development and testing 
of GTM Ground Facilities software. This setup is shown below in Figure 14. 



[Figure 14: the FDM-PC connected to research support systems.]




Figure 14. FDM-PC Configuration 
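
As a concrete illustration of the emulated telemetry output path described above (UDP packets sent to an Ethernet
port), the following is a minimal sketch. The destination address, port number, packet layout, and update rate are
placeholder assumptions, not the actual GTM-GF message definitions.

import socket
import struct
import time

DEST = ("127.0.0.1", 5005)   # hypothetical display/archiving host and port
RATE_HZ = 50                 # illustrative update rate

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

t0 = time.time()
while time.time() - t0 < 1.0:                      # run one second for demonstration
    t = time.time() - t0
    # Placeholder state vector: time, roll, pitch, heading, airspeed.
    packet = struct.pack("<5f", t, 0.0, 2.5, 180.0, 40.0)
    sock.sendto(packet, DEST)
    time.sleep(1.0 / RATE_HZ)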


FDM-PSIM 

The FDM-PSIM operates on the dSPACE system without I/O to the GTM. This setup will receive controller input
signals from the Research Pilot Station interfaced to the dSPACE system. Simulated telemetry data will be output
in UDP format and distributed through the Ethernet bus for display generation and archiving. This setup is shown
below in Figure 15.





Figure 15. FDM-PSIM Configuration 


FDM-HIL 

The FDM-HIL operates with the GTM through the dSPACE platform. This setup receives controller input signals
from the Research Pilot Station through dSPACE input ports, receives telemetry signals from GTM sensors through
dSPACE RS-422 serial input ports, and outputs command signals to the GTM telemetry system through dSPACE RS-422
serial output ports. Additionally, all telemetry data (uplink and downlink), as well as aircraft surface position
commands from the pilot controls, will be output over Ethernet through the support PC for archiving and display
generation. The simulation will have an option for archive playback of the GTM states and controls using
information obtained from the GTM as a means of testing the FDM simulation fidelity. This setup is shown below
in Figure 16.



Figure 16. FDM-HIL Simulation Configuration 





9.2 Networking and Displays 

Networking is an Ethernet application for distributing telemetry and control data from the control computer to
all other computers and devices involved in the ground facilities (ref. Figure 13). It is a self-contained
application, but is used within the dSPACE-based Simulink applications to accomplish data output to the other
systems of the ground facilities.

The displays will be generated by several independent software applications, which render synthetic terrain, 
GTM images, and informational graphics. These include synthetic vision underlays, primary and navigation display 
symbology, and standalone status displays. The networking applications used for displays are derived from work 
done for NASA’s Synthetic Vision research project. The Synthetic Vision-based applications developed for the 
GTM-GF are comprised of three major components: the Message Manager, the Data Server and the Display 
Manager. Each of these components is described briefly below. 

• Message Manager - the Message Manager receives the data streams from the GTM telemetry system,
repackages this information, and sends it to the Data Server. This repackaging is necessary because the
amount and update rate of the data from the model can vary, whereas the Data Server requires complete
messages at a pre-defined rate.

• Data Server - the Data Server receives data from the Message Manager, performs any necessary post-
processing on the data, and places the data into the appropriate computer's memory. Post-processing of the
data may consist of unit conversion (i.e., converting the data from the micro-volt quantities measured by the
GTM's sensors to units that are more appropriate for creating and updating the displays). The Data Server also
has the ability to record the data that it receives.

• Display Manager - the Display Manager reads the computer's memory to retrieve all of the data required
to produce the desired displays. These displays include the Primary Flight Display (PFD), backup PFD,
Navigation Display (ND), and all of the research displays.

The Message Manager is installed only on the computer that links with the GTM telemetry downlink data. The Data 
Server and Display Manager, however, are installed on all of the computers that generate displays. 
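
To make the division of labor concrete, the sketch below mirrors the Message Manager, Data Server, and Display
Manager roles described above. The class and parameter names, the scaling factor, and the dictionary standing in
for the computer's memory are illustrative assumptions, not the actual Synthetic Vision-derived implementation.

from dataclasses import dataclass, field

@dataclass
class SharedMemory:
    """Stand-in for the computer memory the Data Server writes and the Display Manager reads."""
    values: dict = field(default_factory=dict)

class MessageManager:
    """Accumulates variable-rate telemetry and emits complete messages at a fixed rate."""
    def __init__(self):
        self._latest = {}
    def on_telemetry(self, partial):
        self._latest.update(partial)     # keep the most recent value of each parameter
    def emit(self):
        return dict(self._latest)        # complete message, called at a pre-defined rate

class DataServer:
    """Post-processes messages (e.g. unit conversion) and writes them to shared memory."""
    def __init__(self, mem):
        self.mem = mem
    def ingest(self, msg):
        for name, raw in msg.items():
            self.mem.values[name] = raw * 1.0e-6   # placeholder scaling, e.g. micro-volts to volts

class DisplayManager:
    """Reads shared memory to drive the PFD, ND, and research displays."""
    def __init__(self, mem):
        self.mem = mem
    def render(self):
        return ", ".join(f"{k}={v:.3f}" for k, v in self.mem.values.items())

# Example wiring: telemetry arrives in pieces; displays read converted values.
mem = SharedMemory()
mm, ds, dm = MessageManager(), DataServer(mem), DisplayManager(mem)
mm.on_telemetry({"airspeed_raw": 41.2e6})
mm.on_telemetry({"altitude_raw": 310.0e6})
ds.ingest(mm.emit())
print(dm.render())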


10.0 Concluding Remarks 

The GTM-GF is currently being developed and configured to support the experiment “Aerodynamic Model 
Validation For Upset Conditions”. This experiment is being conducted to validate aerodynamic modeling methods 
and data for upset conditions which include conditions beyond the normal flight envelope. These conditions include 
stalls and post-stall departures from controlled open-loop pilot flight at high angles of attack and sideslip with high 
angular rates. The GTM-GF is scheduled to support flight testing activities in Spring 2006. Subsequently, the 
GTM-GF will be configured for testing automatic closed-loop control research systems that are being developed to 
prevent vehicle upset as a result of failures, or recover the vehicle from abnormal flight conditions. 


References 

1. Belcastro, Christine M.; Belcastro, Celeste M.; “Application of Failure Detection, Identification, and 
Accommodation Methods for Improved Aircraft Safety”, Proceedings of the American Control Conference, June 
2001 

2. Belcastro, Christine M.; Belcastro, Celeste M.; “On the Validation of Safety Critical Aircraft Systems, Part I: An 
Overview of Analytical & Simulation Methods”, Proceedings of the Guidance Navigation and Control 
Conference, 2003 

3. Belcastro, Celeste M.; Belcastro, Christine M.; “On the Validation of Safety Critical Aircraft Systems, Part II: An 

Overview of Experimental Methods”, Proceedings of the Guidance Navigation and Control Conference, 2003 

4. Jordan, T. F., Langford, W. M., Belcastro, Christine M., Foster, J. M., Shah, G. H., Howland, G., Kidd, R.,
"Development of a Dynamically Scaled Generic Transport Model Testbed for Flight Research Experiments",
AUVSI Unmanned Systems North America 2004, AUVSI, Arlington, VA, 2004

5. Jordan, Thomas; Langford, William; Hill, Jeffrey S.; "Airborne Subscale Transport Aircraft Research Testbed -
Aircraft Model Development", Proceedings of the Guidance Navigation and Control Conference, 2005





6. Belcastro, Celeste M.; "Overview of the Systems and Airframe Failure Emulation Testing & Integration
(SAFETI) Laboratory at the NASA Langley Research Center", Proceedings of the International Conference on
Lightning and Static Electricity, 2001

7. Bailey, Roger M., Hueschen, Richard M.; “Functional Requirements for Development of Ground Facilities 
Supporting the Generic Transport Model”, Rev. 1.1, NASA Internal Report, April 30, 2004 

8. Hostetler, Robert; "Generic Transport Model Ground Facilities Software Requirements Specification (SRS)",
v. 2.0, NASA Internal Report, 29 July 2005

9. Barnes, Kevin N.; "AirSTAR Mobile Operation Station Procurement Requirements Document", rev. 1.0,
NASA Internal Report, 05 April 2005

10. “GTM Communication Link Messages”, Microbotics, Inc., July 27, 2005 

11. “PCM System, NASA LaRC GTM Custom Build”, Microbotics, Inc., April 27, 2004 

12. “MIDG II Message Specification”, Microbotics, Inc., May 26, 2005 

